Minimum Error Entropy Algorithms with Sparsity Penalty Constraints

Authors

  • Zongze Wu
  • Siyuan Peng
  • Wentao Ma
  • Badong Chen
  • José Carlos Príncipe
Abstract

Recently, sparse adaptive learning algorithms have been developed to exploit system sparsity as well as to mitigate various noise disturbances in many applications. In particular, in sparse channel estimation, a parameter vector with a sparsity characteristic can be well estimated from noisy measurements through a sparse adaptive filter. In previous studies, most works use a mean square error (MSE) based cost to develop sparse filters, which is rational under the assumption of Gaussian distributions. However, the Gaussian assumption does not always hold in real-world environments. To address this issue, in this work we incorporate an l1-norm or a reweighted l1-norm into the minimum error entropy (MEE) criterion to develop new sparse adaptive filters, which may perform much better than the MSE-based methods, especially in heavy-tailed non-Gaussian situations, since the error entropy can capture higher-order statistics of the errors. In addition, a new approximator of the l0-norm, based on the correntropy induced metric (CIM), is also used as a sparsity penalty term (SPT). We analyze the mean square convergence of the proposed new sparse adaptive filters. An energy conservation relation is derived and a sufficient condition ensuring mean square convergence is obtained. Simulation results confirm the superior performance of the new algorithms.
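
To make the idea concrete, below is a minimal NumPy sketch of a sliding-window MEE gradient step combined with a zero-attracting l1 term, a reweighted l1 term, or a CIM-based l0 approximator. The window length, kernel widths (sigma, sigma_cim), step size mu, and penalty weight rho are illustrative assumptions, not the paper's exact parameterization or derivation.

```python
import numpy as np

def gaussian_kernel(u, sigma):
    """Gaussian kernel used in the Parzen estimate of the error distribution."""
    return np.exp(-u ** 2 / (2.0 * sigma ** 2)) / (np.sqrt(2.0 * np.pi) * sigma)

def cim_penalty_gradient(w, sigma_cim):
    """Gradient (up to a constant) of a CIM-based l0-norm approximator,
    with CIM^2(w, 0) proportional to sum_i (1 - exp(-w_i^2 / (2 sigma_cim^2)))."""
    return (w / sigma_cim ** 2) * np.exp(-w ** 2 / (2.0 * sigma_cim ** 2))

def mee_sparse_update(w, X, d, mu=0.05, sigma=1.0, rho=1e-3,
                      penalty="l1", sigma_cim=0.05, eps=10.0):
    """One stochastic update over a sliding window of L samples.

    X : (L, M) matrix of the L most recent input vectors
    d : (L,)   vector of the corresponding desired outputs

    The MEE part ascends the information potential
        V(e) = (1/L^2) * sum_{i,j} kappa_sigma(e_i - e_j)
    (equivalent to descending the quadratic error entropy), while the
    penalty term pulls small coefficients toward zero.
    """
    e = d - X @ w                                   # a-priori errors in the window
    L = len(e)
    grad_V = np.zeros_like(w)
    for i in range(L):
        de = e[i] - e                               # pairwise error differences
        dx = X[i] - X                               # pairwise input differences
        grad_V += (gaussian_kernel(de, sigma) * de) @ dx
    grad_V /= (L ** 2 * sigma ** 2)

    # Sparsity penalty gradient: three illustrative choices
    if penalty == "l1":
        grad_P = np.sign(w)                             # zero-attracting (l1) term
    elif penalty == "rl1":
        grad_P = np.sign(w) / (1.0 + eps * np.abs(w))   # reweighted l1
    else:
        grad_P = cim_penalty_gradient(w, sigma_cim)     # CIM-based l0 approximator

    return w + mu * grad_V - rho * grad_P
```

In a channel estimation simulation, such an update would be applied once per new sample, with X and d holding the most recent L observations.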


Similar articles

The Effects of Initial Sampling and Penalty Functions in Optimal Design of Trusses Using Metaheuristic Algorithms

Although the genetic algorithm (GA), ant colony (AC), and particle swarm optimization (PSO) algorithms have already been applied to various types of engineering problems, the effect of initial sampling, alongside constraints, on the efficiency of these algorithms is still an interesting field. In this paper we show that initial sampling with a special series of constraints plays an important role in the conv...


Recursive Generalized Maximum Correntropy Criterion Algorithm with Sparse Penalty Constraints for System Identification

To address the sparse system identification problem in non-Gaussian impulsive noise environments, the recursive generalized maximum correntropy criterion (RGMCC) algorithm with sparse penalty constraints is proposed to combat impulse-induced instability. Specifically, a recursive algorithm based on the generalized correntropy with a forgetting factor on the error is developed to improve the performan...
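
As a rough illustration of the generalized-correntropy idea behind such algorithms (not that paper's exact recursion), ordinary correntropy's Gaussian kernel is replaced by a generalized Gaussian kernel that down-weights large errors; alpha and beta below are illustrative shape and bandwidth parameters.

```python
import numpy as np
from math import gamma

def generalized_correntropy(e, alpha=2.0, beta=1.0):
    """Generalized Gaussian kernel evaluated at the error samples e.

    alpha controls the tail shape (alpha = 2 recovers the usual Gaussian
    kernel of correntropy) and beta is the kernel bandwidth. Large errors
    receive exponentially small weight, which is what makes correntropy-type
    criteria robust to impulsive (heavy-tailed) noise.
    """
    e = np.asarray(e, dtype=float)
    norm = alpha / (2.0 * beta * gamma(1.0 / alpha))
    return norm * np.exp(-np.abs(e / beta) ** alpha)
```

A recursive estimator along these lines would maximize a forgetting-factor-weighted sum of such kernel values over past errors, with a sparsity penalty added to the cost.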


Boosting Classifiers with Tightened L0-Relaxation Penalties

We propose a novel boosting algorithm which improves on current algorithms for weighted voting classification by striking a better balance between classification accuracy and the sparsity of the weight vector. In order to justify our optimization formulations, we first consider a novel integer linear program as a model for sparse classifier selection, generalizing the minimum disagreement halfs...


Multiple Kernel Multi-Task Learning

Recently, there has been considerable interest in the multi-task learning (MTL) problem under the constraint that tasks should share a common sparsity profile. Such a problem can be addressed through a regularization framework where the regularizer induces a joint-sparsity pattern between task decision functions. We follow this principled framework and focus on lp−lq (with 0 ≤ p ≤ 1 and 1 ≤ q ≤ 2) ...


Structured sparsity with convex penalty functions

We study the problem of learning a sparse linear regression vector under additional conditions on the structure of its sparsity pattern. This problem is relevant in Machine Learning, Statistics and Signal Processing. It is well known that a linear regression can benefit from knowledge that the underlying regression vector is sparse. The combinatorial problem of selecting the nonzero components ...
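
For concreteness, a standard convex penalty in this family is the group lasso, whose proximal operator zeroes out whole groups of coefficients at once. The sketch below assumes fixed, non-overlapping groups and is only an illustration of structured-sparsity penalties in general, not the specific construction studied in that paper.

```python
import numpy as np

def group_soft_threshold(w, groups, tau):
    """Proximal operator of the group-lasso penalty tau * sum_g ||w_g||_2
    for non-overlapping index groups. Each group is either shrunk toward
    zero or set exactly to zero, yielding a structured sparsity pattern
    over the regression vector."""
    out = w.copy()
    for idx in groups:                  # idx: indices belonging to one group
        norm = np.linalg.norm(w[idx])
        out[idx] = 0.0 if norm <= tau else (1.0 - tau / norm) * w[idx]
    return out

# Example: two groups of three consecutive coefficients
w = np.array([0.9, 1.1, 0.8, 0.01, -0.02, 0.03])
groups = [np.arange(0, 3), np.arange(3, 6)]
print(group_soft_threshold(w, groups, tau=0.1))   # the second group is zeroed out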



Journal:
  • Entropy

Volume 17, Issue 

Pages  -

Publication date: 2015